- Data Smoothing
- The use of an algorithm to remove noise from a data set, allowing important patterns to stand out. Data smoothing can be done in a variety of ways, including random, random walk, moving average, simple exponential, linear exponential, and seasonal exponential smoothing. Data smoothing can be used to help predict trends, such as trends in securities prices.
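Two of the methods named above, the moving average and simple exponential smoothing, can be sketched briefly. The function names, parameters, and sample series below are illustrative, not part of the entry.

```python
def moving_average(data, window):
    """Replace each point with the mean of itself and its
    (window - 1) predecessors; shortens the series by window - 1."""
    return [sum(data[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(data))]

def exponential_smoothing(data, alpha):
    """Simple exponential smoothing: weight recent points more
    heavily; alpha in (0, 1] controls how fast old data fades."""
    smoothed = [data[0]]
    for x in data[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

# Hypothetical daily prices, used only to demonstrate the calls.
prices = [100, 102, 101, 105, 104, 108, 107]
print(moving_average(prices, 3))
print(exponential_smoothing(prices, 0.5))
```

A larger `window` or smaller `alpha` smooths more aggressively at the cost of responsiveness to new data.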
The random walk model is commonly used to describe the behavior of financial instruments such as stocks. Some investors believe that there is no relationship between past movement in a security’s price and its future movement. Random walk smoothing assumes that future data points will equal the last available data point plus a random variable. Technical and fundamental analysts disagree with this idea; they believe future movements can be extrapolated by examining past trends.
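The random walk assumption described above, that each future point equals the last available point plus a random variable, can be sketched as a simulation. The function name, the Gaussian shock, and the `sigma` parameter are assumptions made for illustration.

```python
import random

def random_walk_forecast(last_price, steps, sigma=1.0, seed=None):
    """Simulate one possible future path under the random walk model:
    each point is the previous point plus a zero-mean random shock."""
    rng = random.Random(seed)
    path = [last_price]
    for _ in range(steps):
        path.append(path[-1] + rng.gauss(0, sigma))
    return path

# One hypothetical 5-step path starting from a last price of 100.
print(random_walk_forecast(100.0, 5, sigma=1.0, seed=42))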

*Investment dictionary. Academic. 2012.*

### Look at other dictionaries:

- **data smoothing** — noun. The mathematical process of fitting a smooth curve to dispersed data points. … Wiktionary
- **Smoothing** — In statistics and image processing, to smooth a data set is to create an approximating function that attempts to capture important patterns in the data, while leaving out noise or other fine-scale structures/rapid phenomena. Many different… … Wikipedia
- **Data reduction** — The transformation of numerical or alphabetical digital information, derived empirically or experimentally, into a corrected, ordered, and simplified form. Columns and rows are moved around until a diagonal pattern appears, thereby making it easy… … Wikipedia
- **Smoothing spline** — A method of smoothing, or fitting a smooth curve to a set of noisy observations. Definition: let (xᵢ, Yᵢ), i = 1, …, n be a sequence of observations, modeled by the relation E(Yᵢ) = μ(xᵢ). The smoothing spline estimate… … Wikipedia
- **data compression** — Process of reducing the amount of data needed for storage or transmission of a given piece of information (text, graphics, video, sound, etc.), typically by use of encoding techniques. Data compression is characterized as either lossy or lossless. … Universalium
- **Exponential smoothing** — A technique that can be applied to time series data, either to produce smoothed data for presentation or to make forecasts. The time series data themselves are a sequence of observations. The observed phenomenon may be an essentially random… … Wikipedia
- **Numerical smoothing and differentiation** — An experimental datum value can be conceptually described as the sum of a signal and some noise, but in practice the two contributions cannot be separated. The purpose of smoothing is to increase the signal-to-noise ratio without greatly… … Wikipedia
- **Consumption smoothing** — The economic concept used to express people's desire for a stable path of consumption. Since Milton Friedman's permanent income theory (1956) and the Modigliani and Brumberg (1954) life cycle model, the idea that agents prefer a stable… … Wikipedia
- **maximum forward rate smoothing** — An alternative yield curve smoothing technique; the most accurate yield curve smoothing method for forward rates. The yield curve with the smoothest possible forward rate function, consistent with observable data, is closely related to but… … Financial and business terms
- **Savitzky–Golay smoothing filter** — A type of filter first described in 1964 by Abraham Savitzky and Marcel J. E. Golay. [A. Savitzky and M. J. E. Golay (1964). "Smoothing and Differentiation of Data by Simplified Least Squares Procedures".] … Wikipedia
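The Savitzky–Golay filter mentioned above fits a low-degree polynomial to each window of points by least squares. For a five-point quadratic fit, this reduces to a fixed set of convolution weights, (-3, 12, 17, 12, -3)/35, which the sketch below applies directly; the function name and the choice to leave endpoints unsmoothed are my own simplifications.

```python
def savgol5(data):
    """Five-point quadratic Savitzky-Golay smoothing using the
    classical weights (-3, 12, 17, 12, -3)/35. The two points at
    each end are left unsmoothed for simplicity."""
    weights = (-3, 12, 17, 12, -3)
    out = list(data)
    for i in range(2, len(data) - 2):
        out[i] = sum(w * data[i + k]
                     for w, k in zip(weights, range(-2, 3))) / 35
    return out

# A quadratic signal passes through the filter unchanged, which is
# the defining property of a quadratic least-squares fit.
print(savgol5([x * x for x in range(7)]))
```

Unlike a plain moving average, this preserves the height and width of peaks up to quadratic shape, which is why the filter is popular for smoothing spectra before numerical differentiation.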